Kappa Coefficient (κ)
Chance-corrected agreement
The Kappa Coefficient (κ) is a measure of inter-rater agreement for categorical items between two raters.
Kappa (κ) is similar to percent agreement, but it is corrected for the agreement expected by chance.
Cohen's κ is limited to two raters; with more than two raters, Krippendorff's alpha should be used instead.
Calculation
κ = (pₒ − pₑ) / (1 − pₑ)
where pₒ is the observed proportion of agreement between the two raters and pₑ is the proportion of agreement expected by chance.
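As a minimal sketch of this calculation (the function name cohens_kappa and the example ratings are illustrative, not part of the original article), the following Python computes κ from two raters' labels:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Compute Cohen's kappa for two raters' categorical labels."""
    assert len(rater_a) == len(rater_b), "Both raters must rate the same items"
    n = len(rater_a)

    # Observed agreement: proportion of items on which the raters match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Chance agreement: product of each rater's marginal proportions,
    # summed over all categories used by either rater.
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n)
              for c in set(rater_a) | set(rater_b))

    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters classify 10 items as "yes"/"no".
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
b = ["yes", "no",  "no", "yes", "no", "yes", "yes", "yes", "no", "no"]
print(round(cohens_kappa(a, b), 2))  # 0.4 (observed 0.7, chance 0.5)
```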
Scoring
Values range from 0.00 (not reliable) to 1.00 (perfectly reliable).
| Score | Reliability |
|---|---|
| 0.00 – 0.20 | Poor |
| 0.21 – 0.40 | Fair |
| 0.41 – 0.60 | Moderate |
| 0.61 – 0.80 | Good |
| 0.81 – 1.00 | Excellent |
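For illustration, a small sketch (the helper interpret_kappa is hypothetical) mapping a κ value onto the reliability bands in the table above:

```python
def interpret_kappa(kappa):
    """Map a kappa value to the reliability label from the scoring table."""
    if kappa <= 0.20:          # values at or below 0.20 fall in the Poor band
        return "Poor"
    elif kappa <= 0.40:
        return "Fair"
    elif kappa <= 0.60:
        return "Moderate"
    elif kappa <= 0.80:
        return "Good"
    else:
        return "Excellent"

print(interpret_kappa(0.4))  # "Fair"
```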
Citation
For attribution, please cite this work as:
Yomogida N, Kerstein C. Kappa Coefficient (κ). https://yomokerst.com/The Archive/Evidene Based Practice/Reliability/Reliability Tests/kappa_coefficient.html